469 research outputs found

    Quantum sealed-bid auction using a modified scheme for multiparty circular quantum key agreement

    Full text link
    A feasible, secure and collusion-attack-free quantum sealed-bid auction protocol is proposed using a modified scheme for multi-party circular quantum key agreement. In the proposed protocol, the set of all n bidders is grouped into l subsets (sub-circles) in such a way that only the initiator (who prepares the quantum state to be distributed for a particular round of communication and acts as the receiver in that round) is a member of all the subsets (sub-circles) prepared for that round, while every other bidder belongs to a single subset. Each of the n bidders and the auctioneer initiates one round of communication, preparing l copies of an (r − 1)-partite entangled state (one for each sub-circle), where r = n/l + 1. The efficiency and security of the proposed protocol are critically analyzed. It is shown that the proposed protocol is free from the collusion attacks that are possible on existing schemes of quantum sealed-bid auction. Further, it is observed that the security against collusion attacks increases with increasing l, which also reduces the complexity (the number of entangled qubits in each entangled state) of the entangled states to be used, making the scheme scalable and implementable with available technologies. The additional security and scalability are shown to arise from the use of a circular structure in place of the complete-graph or tree-type structures used earlier. Comment: 10 pages, 2 figures
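    As a rough, hedged illustration of the grouping described above (not part of the protocol itself), the sketch below partitions n bidders into l sub-circles, adds the round's initiator to each of them, and reports the resulting sub-circle size r = n/l + 1; the labels and the assumption that l divides n evenly are choices made for the example.

```python
# Sketch of the sub-circle bookkeeping described in the abstract (illustrative only):
# n bidders are split into l disjoint subsets and the round initiator joins each one,
# so every sub-circle has r = n/l + 1 members and needs an (r - 1)-partite entangled state.
def build_sub_circles(bidders, l, initiator):
    n = len(bidders)
    assert n % l == 0, "example assumes l divides n evenly"
    size = n // l
    sub_circles = [
        [initiator] + bidders[i * size:(i + 1) * size]
        for i in range(l)
    ]
    r = n // l + 1                       # members per sub-circle
    entangled_qubits = r - 1             # qubits in each (r - 1)-partite state
    return sub_circles, r, entangled_qubits

bidders = [f"B{i}" for i in range(1, 13)]                 # 12 hypothetical bidders
circles, r, qubits = build_sub_circles(bidders, l=4, initiator="Auctioneer")
print(r, qubits)                         # r = 4, so 3-qubit entangled states per sub-circle
for circle in circles:
    print(circle)
```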

    Which verification qubits perform best for secure communication in noisy channel?

    Full text link
    In secure quantum communication protocols, a set of single qubits prepared using two or more mutually unbiased bases, or a set of n-qubit (n ≥ 2) entangled states of a particular form, is usually used to form a verification string that is subsequently used to detect traces of eavesdropping. The qubits that form the verification string are referred to as decoy qubits, and there exists a large set of different quantum states that can be used as decoy qubits. In the absence of noise, any choice of decoy qubits provides equivalent security. In this paper, we examine this equivalence in noisy environments (e.g., amplitude damping, phase damping, collective dephasing and collective rotation noise channels) by comparing decoy-qubit-assisted schemes of secure quantum communication that use single-qubit states as decoy qubits with schemes that use entangled states as decoy qubits. Our study reveals that single-qubit-assisted schemes perform better in some noisy environments, while some entangled-qubit-assisted schemes perform better in others. Specifically, single-qubit-assisted schemes perform better in amplitude damping and phase damping channels, whereas a few Bell-state-based decoy schemes are found to perform better in the presence of collective noise. Thus, if the kind of noise present in a communication channel (i.e., the characteristics of the channel) is known or measured, the present study can provide the best choice of decoy qubits for implementing schemes of secure quantum communication through that channel. Comment: 11 pages, 4 figures
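    As a hedged, self-contained sketch of the kind of comparison the paper makes (not its actual calculation), the code below applies an amplitude-damping channel, via its standard Kraus operators, to BB84-style single-qubit decoy states and to a Bell-state decoy and compares the resulting fidelities; the damping strength and the particular states are illustrative choices.

```python
# Compare single-qubit decoy states with a Bell-state decoy under amplitude damping
# (illustrative only; the paper also treats phase damping and collective noise).
import numpy as np

def amplitude_damping_kraus(gamma):
    K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
    K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)
    return [K0, K1]

def apply_channel(rho, kraus_ops):
    return sum(K @ rho @ K.conj().T for K in kraus_ops)

def fidelity_pure(psi, rho):
    return np.real(psi.conj() @ rho @ psi)   # <psi|rho|psi> for a pure input state

gamma = 0.2
ops1 = amplitude_damping_kraus(gamma)
ops2 = [np.kron(A, B) for A in ops1 for B in ops1]   # channel acting on both qubits

# BB84-style single-qubit decoy states: |0>, |1>, |+>, |->
singles = [np.array(v, dtype=complex) for v in ([1, 0], [0, 1], [1, 1], [1, -1])]
singles = [v / np.linalg.norm(v) for v in singles]
avg_single = np.mean([fidelity_pure(s, apply_channel(np.outer(s, s.conj()), ops1))
                      for s in singles])

# Bell-state decoy |phi+> = (|00> + |11>)/sqrt(2)
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
f_bell = fidelity_pure(bell, apply_channel(np.outer(bell, bell.conj()), ops2))

print(f"average single-qubit decoy fidelity: {avg_single:.4f}")
print(f"Bell-state decoy fidelity:           {f_bell:.4f}")
```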

    Optimization of the Risk Portfolio Using the Markowitz MVO Model in Relation to Human Limitations in Predicting the Future, from the Perspective of the Al-Qur'an

    Full text link
    Risk portfolio analysis in modern finance has become increasingly technical, requiring sophisticated mathematical tools in both research and practice. Since companies cannot insure themselves completely against risk, owing to the human inability to predict the future precisely as written in Al-Qur'an surah Luqman verse 34, they have to manage it to yield an optimal portfolio. The objective is to minimize the variance among all portfolios that attain at least a certain expected return or, alternatively, to maximize the expected return for a given level of risk. This study focuses on optimizing the risk portfolio using Markowitz MVO (Mean-Variance Optimization). The theoretical frameworks used in the analysis are the arithmetic mean, geometric mean, variance, covariance, linear programming, and quadratic programming. Finding a minimum-variance portfolio leads to a convex quadratic program: minimize the objective function xᵀQx subject to the constraints μᵀx ≥ r and Ax = b. The outcome of this research is the optimal risk portfolio over several investments, obtained using MATLAB R2007b software together with its graphical analysis.
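    As an illustrative sketch of the quadratic program stated above (the paper itself uses MATLAB R2007b), the same mean-variance optimization can be set up in Python; the expected returns, covariance matrix, and target return below are made-up example values.

```python
# Minimal mean-variance (Markowitz MVO) sketch: minimize x'Qx subject to an
# expected-return floor and a full-investment constraint. Example data are
# illustrative only; the paper itself solves the problem in MATLAB R2007b.
import numpy as np
from scipy.optimize import minimize

mu = np.array([0.08, 0.12, 0.10])          # assumed expected returns per asset
Q = np.array([[0.10, 0.02, 0.01],          # assumed covariance matrix of returns
              [0.02, 0.08, 0.03],
              [0.01, 0.03, 0.09]])
r_min = 0.10                               # required minimum expected return

def variance(x):
    return x @ Q @ x                       # portfolio variance x'Qx

constraints = [
    {"type": "ineq", "fun": lambda x: mu @ x - r_min},  # mu'x >= r
    {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0}, # weights sum to 1 (Ax = b)
]
bounds = [(0.0, 1.0)] * len(mu)            # no short selling, for simplicity

result = minimize(variance, x0=np.full(len(mu), 1 / 3),
                  bounds=bounds, constraints=constraints, method="SLSQP")
print("optimal weights:", result.x)
print("portfolio variance:", variance(result.x))
```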

    An embedding technique to determine ττ backgrounds in proton-proton collision data

    Get PDF
    An embedding technique is presented to estimate standard model ττ backgrounds from data with minimal simulation input. In the data, the muons are removed from reconstructed μμ events and replaced with simulated tau leptons with the same kinematic properties. In this way, a set of hybrid events is obtained that does not rely on simulation except for the decay of the tau leptons. The challenges in describing the underlying event or the production of associated jets in the simulation are avoided. The technique described in this paper was developed for CMS. Its validation and the inherent uncertainties are also discussed. The demonstration of the performance of the technique is based on a sample of proton-proton collisions collected by CMS in 2017 at √s = 13 TeV corresponding to an integrated luminosity of 41.5 fb⁻¹. Peer reviewed
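    The replacement step can be caricatured as in the sketch below; this is a heavily simplified, hypothetical rendering of the idea (the actual CMS embedding re-simulates the tau decays and the detector response), with the event record and field names invented for illustration.

```python
# Highly simplified sketch of the embedding idea: take a recorded mu-mu event,
# drop the two muons, and insert simulated taus carrying the same kinematics,
# keeping the rest of the event untouched. Event structure and field names are
# invented for illustration; real embedding also re-simulates the tau decays
# and the detector response.
from copy import deepcopy

def embed_taus(mumu_event, simulate_tau_decay):
    """Return a hybrid event with the muons replaced by simulated taus."""
    hybrid = deepcopy(mumu_event)
    muons = hybrid.pop("muons")                    # remove the reconstructed muons
    hybrid["taus"] = [
        simulate_tau_decay(pt=m["pt"], eta=m["eta"], phi=m["phi"])
        for m in muons                             # same kinematic properties
    ]
    return hybrid

# Toy usage with a stand-in for the tau simulation:
fake_decay = lambda pt, eta, phi: {"pt": pt, "eta": eta, "phi": phi, "decay": "simulated"}
event = {"muons": [{"pt": 45.0, "eta": 0.3, "phi": 1.1},
                   {"pt": 38.0, "eta": -1.2, "phi": -2.0}],
         "jets": [{"pt": 60.0, "eta": 2.1, "phi": 0.4}]}
print(embed_taus(event, fake_decay))
```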

    Measurement of tt̄ normalised multi-differential cross sections in pp collisions at √s = 13 TeV, and simultaneous determination of the strong coupling strength, top quark pole mass, and parton distribution functions

    Get PDF
    Peer reviewed

    Electroweak production of two jets in association with a Z boson in proton-proton collisions at √s = 13 TeV

    Get PDF
    A measurement of the electroweak (EW) production of two jets in association with a Z boson in proton-proton collisions at √s = 13 TeV is presented, based on data recorded in 2016 by the CMS experiment at the LHC corresponding to an integrated luminosity of 35.9 fb⁻¹. The measurement is performed in the lljj final state, with l including electrons and muons and the jets j corresponding to the quarks produced in the hard interaction. The measured cross section in a kinematic region defined by invariant masses m_ll > 50 GeV, m_jj > 120 GeV, and transverse momenta p_T^j > 25 GeV is σ_EW(lljj) = 534 ± 20 (stat) (syst) fb, in agreement with leading-order standard model predictions. The final state is also used to perform a search for anomalous trilinear gauge couplings. No evidence is found, and limits on anomalous trilinear gauge couplings associated with dimension-six operators are given in the framework of an effective field theory. The corresponding 95% confidence level intervals are −2.6 < c_WWW/Λ² < 2.6 TeV⁻² and −8.4 < c_W/Λ² < 10.1 TeV⁻². The additional jet activity of events in a signal-enriched region is also studied, and the measurements are in agreement with predictions. Peer reviewed
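    The fiducial region quoted above can be illustrated with the toy selection below; the helper functions and the example event are not CMS code, just a sketch of applying the m_ll, m_jj, and p_T^j requirements to generic four-vectors.

```python
# Sketch of the fiducial selection quoted above (m_ll > 50 GeV, m_jj > 120 GeV,
# p_T^j > 25 GeV) applied to toy four-vectors; the example values are made up.
import math

def four_vector(pt, eta, phi, mass=0.0):
    px, py, pz = pt * math.cos(phi), pt * math.sin(phi), pt * math.sinh(eta)
    e = math.sqrt(px**2 + py**2 + pz**2 + mass**2)
    return e, px, py, pz

def invariant_mass(p1, p2):
    e, px, py, pz = (a + b for a, b in zip(p1, p2))
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

def passes_fiducial(leptons, jets):
    jets = [j for j in jets if j["pt"] > 25.0]              # p_T^j > 25 GeV
    if len(leptons) < 2 or len(jets) < 2:
        return False
    m_ll = invariant_mass(four_vector(**leptons[0]), four_vector(**leptons[1]))
    m_jj = invariant_mass(four_vector(**jets[0]), four_vector(**jets[1]))
    return m_ll > 50.0 and m_jj > 120.0

leptons = [{"pt": 60.0, "eta": 0.5, "phi": 0.2}, {"pt": 40.0, "eta": -0.8, "phi": 2.5}]
jets = [{"pt": 90.0, "eta": 2.5, "phi": -1.0}, {"pt": 45.0, "eta": -3.0, "phi": 1.7}]
print(passes_fiducial(leptons, jets))
```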

    Search for supersymmetry in events with one lepton and multiple jets in proton-proton collisions at root s=13 TeV

    Get PDF
    Peer reviewed

    Bose-Einstein correlations of charged hadrons in proton-proton collisions at √s = 13 TeV

    Get PDF
    Bose-Einstein correlations of charged hadrons are measured over a broad multiplicity range, from a few particles up to about 250 reconstructed charged hadrons, in proton-proton collisions at √s = 13 TeV. The results are based on data collected using the CMS detector at the LHC during runs with a special low-pileup configuration. Three analysis techniques with different degrees of dependence on simulations are used to remove the non-Bose-Einstein background from the correlation functions. All three methods give consistent results. The measured lengths of homogeneity are studied as functions of particle multiplicity as well as average pair transverse momentum and mass. The results are compared with data from both CMS and ATLAS at √s = 7 TeV, as well as with theoretical predictions.
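    A minimal sketch of how a two-particle correlation function can be formed from same-event and mixed-event pairs is shown below; this generic event-mixing construction is for illustration only, it is not one of the three background-removal techniques used in the paper, and the toy momenta are random.

```python
# Toy construction of a two-particle correlation function C2 from same-event
# pairs divided by mixed-event (reference) pairs. Generic event-mixing
# illustration only; the "events" here are random three-momenta, and |p1 - p2|
# stands in for the invariant relative momentum q_inv.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
events = [rng.normal(0.0, 0.5, size=(rng.integers(5, 15), 3)) for _ in range(200)]

def pair_q(p1, p2):
    return np.linalg.norm(p1 - p2)        # simple pair variable for the sketch

same = [pair_q(p1, p2) for ev in events for p1, p2 in combinations(ev, 2)]
mixed = [pair_q(p1, p2)                   # pairs taken from different events
         for ev1, ev2 in zip(events[:-1], events[1:])
         for p1 in ev1 for p2 in ev2]

bins = np.linspace(0.0, 2.0, 41)
n_same, _ = np.histogram(same, bins=bins)
n_mixed, _ = np.histogram(mixed, bins=bins)
# Normalise the reference to the same number of pairs before taking the ratio.
c2 = n_same / np.maximum(n_mixed * (len(same) / len(mixed)), 1e-9)
print(np.round(c2[:10], 3))
```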

    Search for dark matter in events with a leptoquark and missing transverse momentum in proton-proton collisions at 13 TeV

    Get PDF
    A search is presented for dark matter in proton-proton collisions at a center-of-mass energy of √s = 13 TeV using events with at least one high transverse momentum (p_T) muon, at least one high-p_T jet, and large missing transverse momentum. The data were collected with the CMS detector at the CERN LHC in 2016 and 2017, and correspond to an integrated luminosity of 77.4 fb⁻¹. In the examined scenario, a pair of scalar leptoquarks is assumed to be produced. One leptoquark decays to a muon and a jet while the other decays to dark matter and low-p_T standard model particles. The signature for signal events would be significant missing transverse momentum from the dark matter in conjunction with a peak at the leptoquark mass in the invariant mass distribution of the highest-p_T muon and jet. The data are observed to be consistent with the background predicted by the standard model. For the first benchmark scenario considered, dark matter masses up to 500 GeV are excluded for leptoquark masses m_LQ ≈ 1400 GeV, and up to 300 GeV for m_LQ ≈ 1500 GeV. For the second benchmark scenario, dark matter masses up to 600 GeV are excluded for m_LQ ≈ 1400 GeV. Peer reviewed
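    The signature described above (large missing transverse momentum together with a peak in the invariant mass of the highest-p_T muon and jet) can be sketched with the toy selection below; the event format, the missing-momentum threshold, and the massless-particle approximation are assumptions made for illustration, not the analysis selection.

```python
# Toy version of the signature discussed above: require large missing transverse
# momentum and compute the invariant mass of the highest-pT muon and jet, where
# a leptoquark signal would produce a peak near m_LQ. Event format, thresholds
# and the massless approximation are illustrative assumptions only.
import math

def p4(pt, eta, phi):                     # massless four-vector (E, px, py, pz)
    return (pt * math.cosh(eta), pt * math.cos(phi),
            pt * math.sin(phi), pt * math.sinh(eta))

def mass(a, b):
    e, px, py, pz = (x + y for x, y in zip(a, b))
    return math.sqrt(max(e**2 - px**2 - py**2 - pz**2, 0.0))

def mu_jet_mass_if_selected(event, met_cut=250.0):
    if event["met"] < met_cut or not event["muons"] or not event["jets"]:
        return None
    mu = max(event["muons"], key=lambda x: x["pt"])   # highest-pT muon
    jet = max(event["jets"], key=lambda x: x["pt"])   # highest-pT jet
    return mass(p4(**mu), p4(**jet))

event = {"met": 320.0,
         "muons": [{"pt": 210.0, "eta": 0.4, "phi": 1.0}],
         "jets": [{"pt": 500.0, "eta": -0.7, "phi": -2.0},
                  {"pt": 80.0, "eta": 1.9, "phi": 0.3}]}
print(mu_jet_mass_if_selected(event))     # invariant mass of the muon-jet pair in GeV
```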